# Pre-training & Fine-tuning
## Multi2convai Quality De Bert
A BERT model optimized for German, focused on text classification in the quality domain.
- **Author:** inovex
- **License:** MIT
- **Task:** Text Classification
- **Tags:** Transformers, German
- **Downloads:** 116 · **Likes:** 0
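The entries in this listing are Transformers checkpoints, so most can be loaded through the generic `pipeline` API. A minimal sketch for the German quality-domain classifier above; the repo id is an assumption based on the author and model name, not confirmed by the listing:

```python
# Hedged sketch: loading a text-classification checkpoint from the Hugging
# Face Hub. MODEL_ID is assumed, not confirmed by the listing above.

MODEL_ID = "inovex/multi2convai-quality-de-bert"  # assumed repo id

def build_classifier(model_id: str = MODEL_ID):
    """Build a text-classification pipeline; weights download on first use."""
    from transformers import pipeline  # lazy import keeps the sketch light
    return pipeline("text-classification", model=model_id)

# Usage (requires network access for the first download):
# clf = build_classifier()
# clf("Die Qualität ist ausgezeichnet.")
```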
## Videomae Base Finetuned Kinetics Finetuned Dcsass Shoplifting Subset
A video classification model based on the VideoMAE architecture, fine-tuned specifically for shoplifting behavior detection.
- **Author:** Abdullah1
- **Task:** Video Processing
- **Tags:** Transformers
- **Downloads:** 23 · **Likes:** 0
## Primarygleasonbert
A text classification model built on Bio_ClinicalBERT, a BERT variant optimized for biomedical and clinical text.
- **Author:** jkefeli
- **Task:** Text Classification
- **Tags:** Transformers
- **Downloads:** 30 · **Likes:** 2
## Codet5 Base Codexglue Sum Python
A code summarization model based on the CodeT5-base architecture, fine-tuned on the Python portion of the CodeXGLUE dataset.
- **Author:** Salesforce
- **License:** BSD-3-Clause
- **Task:** Text Generation
- **Tags:** Transformers
- **Downloads:** 58 · **Likes:** 8
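CodeT5 is T5-based, so code summarization can be framed as conditional generation. A hedged sketch of querying the checkpoint above; the repo id and generation settings are assumptions for illustration:

```python
# Hedged sketch: summarizing Python source with a CodeT5 checkpoint.
# MODEL_ID is assumed from the listing, not confirmed.

MODEL_ID = "Salesforce/codet5-base-codexglue-sum-python"  # assumed repo id

def summarize_code(source: str, model_id: str = MODEL_ID, max_length: int = 48) -> str:
    """Generate a short natural-language summary of a Python snippet."""
    from transformers import AutoTokenizer, T5ForConditionalGeneration
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = T5ForConditionalGeneration.from_pretrained(model_id)
    ids = tokenizer(source, return_tensors="pt", truncation=True).input_ids
    out = model.generate(ids, max_length=max_length)
    return tokenizer.decode(out[0], skip_special_tokens=True)

# Usage (downloads weights on first call):
# summarize_code("def add(a, b):\n    return a + b")
```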
## Randeng Pegasus 523M Summary Chinese V1
A Chinese PEGASUS-large model specialized in text summarization, fine-tuned on multiple Chinese summarization datasets.
- **Author:** IDEA-CCNL
- **Task:** Text Generation
- **Tags:** Transformers, Chinese
- **Downloads:** 95 · **Likes:** 5
## Randeng Pegasus 523M Summary Chinese
A Chinese PEGASUS-large model specialized in text summarization, fine-tuned on multiple Chinese summarization datasets.
- **Author:** IDEA-CCNL
- **Task:** Text Generation
- **Tags:** Transformers, Chinese
- **Downloads:** 9,549 · **Likes:** 58
## Tapex Large Finetuned Sqa
TAPEX-large is a large-scale language model pre-trained on tabular data and fine-tuned for table question answering. It learns table understanding by acting as a neural SQL executor and can answer natural language questions about table content.
- **Author:** nielsr
- **License:** Apache-2.0
- **Task:** Question Answering
- **Tags:** Transformers, English
- **Downloads:** 30 · **Likes:** 0
## Tapex Large Finetuned Wikisql
A TAPEX-large table question-answering model fine-tuned on the WikiSQL dataset, capable of understanding natural language questions and producing answers grounded in tabular data.
- **Author:** nielsr
- **License:** Apache-2.0
- **Task:** Question Answering
- **Tags:** Transformers, English
- **Downloads:** 27 · **Likes:** 0
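TAPEX's documented usage pairs `TapexTokenizer` with `BartForConditionalGeneration`, encoding a pandas table together with the question. A hedged sketch for the WikiSQL checkpoint above; the repo id is assumed from the listing:

```python
# Hedged sketch: table question answering with a TAPEX checkpoint.
# MODEL_ID is assumed, not confirmed by the listing.

MODEL_ID = "nielsr/tapex-large-finetuned-wikisql"  # assumed repo id

def answer_table_question(table_dict: dict, query: str, model_id: str = MODEL_ID) -> str:
    """Encode a table plus a question and generate a textual answer."""
    import pandas as pd
    from transformers import TapexTokenizer, BartForConditionalGeneration

    tokenizer = TapexTokenizer.from_pretrained(model_id)
    model = BartForConditionalGeneration.from_pretrained(model_id)
    table = pd.DataFrame.from_dict(table_dict)
    encoding = tokenizer(table=table, query=query, return_tensors="pt")
    outputs = model.generate(**encoding)
    return tokenizer.batch_decode(outputs, skip_special_tokens=True)[0].strip()

# Usage (downloads weights on first call):
# answer_table_question({"city": ["Paris", "Rome"], "country": ["France", "Italy"]},
#                       "which country is Paris in?")
```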
## Wangchanberta Base Att Spm Uncased Finetune Qa
A Thai question-answering model fine-tuned from WangchanBERTa, trained on multiple Thai QA datasets.
- **Author:** cstorm125
- **Task:** Question Answering
- **Tags:** Transformers
- **Downloads:** 30 · **Likes:** 0
## Bart R3f
A Korean dialogue summarization model based on pre-trained BART and trained with the R3F technique; an entry in the 2021 Hunminjeongeum Korean Speech & Natural Language AI Competition.
- **Author:** alaggung
- **Task:** Text Generation
- **Tags:** Transformers, Korean
- **Downloads:** 135 · **Likes:** 6
## Bert Base Arabic Camelbert Mix Ner
An Arabic named entity recognition model fine-tuned from CAMeLBERT-Mix, covering Modern Standard Arabic, dialectal Arabic, and Classical Arabic.
- **Author:** CAMeL-Lab
- **License:** Apache-2.0
- **Task:** Sequence Labeling
- **Tags:** Transformers, Arabic
- **Downloads:** 24.24k · **Likes:** 13
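Token-classification models such as the Arabic NER checkpoint above are typically run through the `"ner"` pipeline, with an aggregation strategy to merge subword predictions into whole entities. A hedged sketch; the repo id is assumed from the author and model name:

```python
# Hedged sketch: named entity recognition via the Transformers pipeline API.
# MODEL_ID is assumed, not confirmed by the listing.

MODEL_ID = "CAMeL-Lab/bert-base-arabic-camelbert-mix-ner"  # assumed repo id

def build_ner(model_id: str = MODEL_ID):
    """NER pipeline that groups subword predictions into entity spans."""
    from transformers import pipeline
    return pipeline("ner", model=model_id, aggregation_strategy="simple")

# Usage (downloads weights on first call); each result dict carries
# entity_group, score, word, start, and end fields:
# ner = build_ner()
# ner("...")  # pass an Arabic sentence
```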
## Reasonbert RoBERTa
A pre-trained model based on the RoBERTa architecture with enhanced reasoning capabilities, optimized for tasks such as question answering.
- **Author:** Anonymous
- **Task:** Large Language Model
- **Tags:** Transformers
- **Downloads:** 13 · **Likes:** 0
## Danish Bert Botxo Ner Dane
A Danish pre-trained BERT model developed by Certainly (formerly BotXO) and fine-tuned by Malte Højmark-Bertelsen on the DaNE dataset for named entity recognition.
- **Author:** Maltehb
- **Task:** Sequence Labeling
- **Tags:** Other
- **Downloads:** 594 · **Likes:** 4
## Bert Base Arabic Camelbert Mix Sentiment
A sentiment analysis model fine-tuned from the CAMeLBERT-Mix model for Arabic text sentiment classification.
- **Author:** CAMeL-Lab
- **License:** Apache-2.0
- **Task:** Text Classification
- **Tags:** Transformers, Arabic
- **Downloads:** 108.27k · **Likes:** 6